Character.AI Bans Teen Chats Following Tragic Incidents and Regulatory Pressure
Character.AI will disable open-ended chat features for users under 18 by November 25, shifting minors to creative tools like video and story generation. The decision follows the suicide of 14-year-old Sewell Setzer III, who developed an obsessive attachment to a chatbot on the platform.
The move aligns with a bipartisan Senate bill that would criminalize AI products that groom minors or generate sexual content for children. The company cited feedback from regulators, safety experts, and parents as the impetus for the change, calling it "the right thing to do." Until the deadline, teen users are subject to a two-hour daily chat limit that will decrease progressively.
Legal pressure is mounting as the platform faces lawsuits, including one filed by Setzer's mother; his death underscores the risks of unregulated AI interactions. The shift reflects broader industry scrutiny of AI's societal impact, particularly on vulnerable demographics.